Jet properties from di-hadron correlations in p+p collisions at s**(1/2) = 200-GeV
An analysis of high pT hadron spectra associated with high pT
particles in p+p collisions at s**(1/2) = 200-GeV is presented. The shape of
the azimuthal angular correlation is used to determine the value of partonic
intrinsic transverse momentum \sqrt{\langle k_T^2 \rangle} = 2.68 \pm 0.07(\rm stat) \pm
0.34(\rm sys) GeV/c. The effect of kT-smearing on the inclusive cross
section is discussed.
Comment: To appear in the proceedings of the 2nd International Conference on Hard
and Electromagnetic Probes of High-Energy Nuclear Collisions (Hard Probes
2006), Asilomar, Pacific Grove, California, 9-16 Jun 2006
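The width of the away-side peak in the azimuthal correlation is what carries the kT information. A hedged sketch of the fitting step, assuming a two-Gaussian-plus-pedestal model on synthetic data (the real analysis additionally folds in fragmentation jT and kinematic corrections to extract \sqrt{\langle k_T^2 \rangle}):

```python
import numpy as np
from scipy.optimize import curve_fit

def dihadron_model(dphi, n_near, s_near, n_away, s_away, bkg):
    """Near-side Gaussian at 0, away-side Gaussian at pi, flat pedestal."""
    near = n_near * np.exp(-0.5 * (dphi / s_near) ** 2)
    away = n_away * np.exp(-0.5 * ((dphi - np.pi) / s_away) ** 2)
    return near + away + bkg

# Synthetic correlation histogram standing in for real data
rng = np.random.default_rng(0)
dphi = np.linspace(-np.pi / 2, 3 * np.pi / 2, 60)
truth = dihadron_model(dphi, 1.0, 0.25, 0.5, 0.45, 0.2)
data = truth + rng.normal(0, 0.01, dphi.size)

popt, pcov = curve_fit(dihadron_model, dphi, data, p0=[1, 0.3, 0.5, 0.5, 0.1])
sigma_away = abs(popt[3])  # away-side width carries the kT broadening
```

The away-side width alone is not \sqrt{\langle k_T^2 \rangle}; the paper's value follows after unfolding the jT contribution.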
Materialized views in distributed key-value stores
Distributed key-value stores have become the solution of choice for warehousing large volumes of data. However, their architecture is not suited to real-time analytics. To achieve the required velocity, materialized views can be used to provide summarized data for fast access. The main challenge, then, is the incremental, consistent maintenance of views at large scale. We therefore introduce our View Maintenance System (VMS) to maintain SQL queries in a data-intensive real-time scenario.
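The incremental-maintenance idea can be sketched in a few lines: on every put, retract the old row's contribution from the view and apply the new one, rather than recomputing the query. A minimal Python sketch for a SUM-per-group view; class and method names are illustrative, not the actual VMS API:

```python
# Hypothetical sketch of incremental view maintenance over a key-value
# store; names are illustrative, not the VMS interface.
class SumView:
    """Materialized SUM(value) GROUP BY group, updated per mutation."""
    def __init__(self):
        self.view = {}   # group -> running sum (the materialized view)
        self.base = {}   # key -> (group, value), mirrors the base table

    def put(self, key, group, value):
        # Incremental delta: retract the old row, apply the new one.
        if key in self.base:
            old_group, old_value = self.base[key]
            self.view[old_group] -= old_value
        self.base[key] = (group, value)
        self.view[group] = self.view.get(group, 0) + value

    def delete(self, key):
        group, value = self.base.pop(key)
        self.view[group] -= value

v = SumView()
v.put("k1", "eu", 10)
v.put("k2", "eu", 5)
v.put("k1", "us", 3)   # update moves k1 into another group
# v.view is now {"eu": 5, "us": 3}
```

Consistency at scale (concurrent mutations, exactly-once application of deltas) is the hard part the system addresses; this sketch only shows the per-mutation delta.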
Digital soil mapping of copper in Sweden: Using the prediction and uncertainty as decision support in crop micronutrient management
Digital soil mapping (DSM) of topsoil copper (Cu) concentrations, with 90% prediction intervals, was performed for agricultural land in Sweden in order to identify areas at risk of Cu deficiency. A total of 12,527 soil samples were used to calibrate the DSM model, using airborne gamma radiation data, climate data, topographical data and soil texture class data. Among the samples included, 11,093 had no laboratory-analysed Cu concentrations, so their Cu concentrations were predicted from portable X-ray fluorescence (PXRF) measurements. Cross-validation of the PXRF model resulted in a Nash-Sutcliffe model efficiency coefficient (E) of 0.66 and a mean absolute error (MAE) of 3.3 mg kg−1. Cross-validation of the DSM model showed somewhat lower performance (E = 0.57, MAE = 4.1 mg kg−1). Based on the lower bound of the prediction interval (5th percentile), 48% of agricultural soils in Sweden are most likely not at risk of Cu deficiency (>7 mg kg−1). The Cu map was also validated against concentrations in soil samples from five fields (25–47 ha in size; four samples per ha). The field means were predicted with an MAE of 1.0 mg kg−1, and within-field variation was reproduced with a field-wise squared Pearson correlation coefficient (r2) of 0–0.36. The classification metric ‘recall’ showed that the map of soil Cu concentrations might not predict all possible areas at risk of being Cu deficient, as observational data indicate that about 22% of soils in the mapped area should have Cu concentrations below the risk limit. However, the metric ‘precision’ showed that when the soil map predicted a concentration at or below 7 mg kg−1, it was generally correct. Increasing the limit caused both recall and precision to increase rapidly. The remaining 52% of agricultural soils, at risk of being below the Cu concentration limit, can be targeted by laboratory analysis or monitoring.
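The two cross-validation metrics quoted above are standard and easy to state in code. A sketch with toy Cu concentrations (illustrative values, not the paper's data):

```python
import numpy as np

def nash_sutcliffe(obs, pred):
    """E = 1 - SSE / total sum of squares; 1 is perfect,
    0 means no better than predicting the mean of the observations."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    return 1.0 - np.sum((obs - pred) ** 2) / np.sum((obs - obs.mean()) ** 2)

def mean_absolute_error(obs, pred):
    return float(np.mean(np.abs(np.asarray(obs, float) - np.asarray(pred, float))))

# Toy Cu concentrations in mg/kg
obs  = [5.0, 8.0, 12.0, 7.0, 15.0]
pred = [6.0, 7.5, 11.0, 8.0, 13.0]
e   = nash_sutcliffe(obs, pred)
mae = mean_absolute_error(obs, pred)
```

The same predictions, thresholded at 7 mg kg−1, would feed the recall/precision analysis of deficiency risk described in the abstract.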
Collisional energy loss and the suppression of high pT hadrons
We calculate the nuclear suppression factor (R_{AA}) for light hadrons by
taking only the elastic processes into account, and argue that in the measured domain
of RHIC, collisional rather than radiative processes are the dominant
mechanism for partonic energy loss.
Comment: Presented at the International Conference on Strong and Electroweak
Matter 2006, May 10-13, Brookhaven National Laboratory
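For reference, the nuclear suppression factor used here is conventionally defined as the yield in A+A collisions divided by the binary-collision-scaled p+p yield:

```latex
R_{AA}(p_T) = \frac{\mathrm{d}^2 N^{AA}/\mathrm{d}p_T\,\mathrm{d}\eta}
                   {\langle N_{\mathrm{coll}}\rangle\,\mathrm{d}^2 N^{pp}/\mathrm{d}p_T\,\mathrm{d}\eta}
```

R_{AA} < 1 at high p_T signals in-medium partonic energy loss.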
Probing collectivity in ultra-relativistic heavy ion collision by leptons and photons
It has been shown that the evolution of collectivity in ultra-relativistic
heavy ion collisions is manifested in the variation of the various HBT radii with
invariant mass (M) extracted from the correlation functions of two lepton
pairs. The value of the radial velocity can be estimated from the ratio
of the distributions of single photons to lepton pairs for various M
windows. It has been argued that the variation of radial flow with appropriate
kinematic variables can be used as an indicator of a phase transition from
initially produced partons to hadrons. We also consider the elliptic flow
(v_2) of the matter as probed by the single electron spectra originating
from the semileptonic decays of heavy mesons. The measured values of v_2
and the nuclear suppression factor (R_{AA}) at RHIC energy have been
reproduced simultaneously by including both the collisional and radiative
processes within the scope of perturbative quantum chromodynamics. The v_2
and R_{AA} have been predicted for LHC energy.
Comment: Plenary talk at ICPAQGP 2010, December 6-10, 2010, Goa, India
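The elliptic flow referred to above is the second Fourier coefficient of the azimuthal particle distribution with respect to the reaction plane:

```latex
\frac{\mathrm{d}N}{\mathrm{d}\phi} \propto 1 + 2\sum_{n\geq 1} v_n \cos\!\big[n(\phi - \Psi_{RP})\big],
\qquad v_2 = \big\langle \cos 2(\phi - \Psi_{RP}) \big\rangle
```

A nonzero v_2 for heavy-flavor decay electrons indicates that the heavy quarks themselves participate in the collective expansion.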
STAR inner tracking upgrade - A performance study
Anisotropic flow measurements have demonstrated development of partonic
collectivity in Au+Au collisions at RHIC. To understand the
partonic EOS, thermalization must be addressed. Collective motion of
heavy-flavor (c,b) quarks can be used to indicate the degree of thermalization
of the light-flavor quarks (u,d,s). Measurement of heavy-flavor quark
collectivity requires direct reconstruction of heavy-flavor hadrons in the low
\pt region. Measurement of open charm spectra to high \pt can be used to
investigate heavy-quark energy loss and medium properties. The Heavy Flavor
Tracker (HFT), a proposed upgrade to the STAR experiment at midrapidity, will
measure open-charm hadrons down to very low \pt by reconstructing their
displaced decay vertices. The innermost part of the HFT is the PIXEL detector
(made of two low mass monolithic active pixel sensor layers), which delivers a
high precision position measurement close to the collision vertex. The
Intermediate Silicon Tracker (IST), a 1-layer strip detector, is essential to
improve hit identification in the PIXEL detector when running at full RHIC-II
luminosity. Using a full GEANT simulation, open charm measurement capabilities
of STAR with the HFT will be shown. Its performance in a broad \pt range will
be demonstrated on v_2 (\pt > 0.5\,\mathrm{GeV}/c) and R_{CP}
(\pt < 10\,\mathrm{GeV}/c) measurements of the \D meson. Results of the
reconstruction of the \Lc baryon in heavy-ion collisions are also presented.
Comment: to appear in EPJ C (Hot Quarks 2008 conference volume)
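The displaced-vertex selection described above reduces, geometrically, to a cut on the distance between the reconstructed decay vertex and the primary vertex. A minimal sketch; the cut value and function names are illustrative, not STAR's actual analysis cuts (for scale, the D^0 has a c·tau of roughly 120 micrometers):

```python
import numpy as np

# Illustrative secondary-vertex selection; the 100 um cut is made up,
# not an HFT analysis cut.
def decay_length(primary_vtx, secondary_vtx):
    """Euclidean distance between primary and secondary vertex."""
    return float(np.linalg.norm(np.asarray(secondary_vtx, float)
                                - np.asarray(primary_vtx, float)))

def passes_displaced_vertex_cut(primary_vtx, secondary_vtx,
                                min_decay_length_um=100.0):
    """Keep candidates whose decay vertex is resolvably displaced
    from the primary vertex (all lengths in micrometers)."""
    return decay_length(primary_vtx, secondary_vtx) > min_decay_length_um

primary = (0.0, 0.0, 0.0)
candidate_far  = (80.0, 90.0, 10.0)   # ~121 um from the primary vertex
candidate_near = (20.0, 10.0, 5.0)    # ~23 um, consistent with prompt production
```

In practice the cut is applied on the decay-length significance (length over its resolution), which is why the few-micrometer pointing resolution of the PIXEL layers is the driving requirement.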
Probing Cosmology with Weak Lensing Minkowski Functionals
In this paper, we show that Minkowski Functionals (MFs) of weak gravitational
lensing (WL) convergence maps contain significant non-Gaussian,
cosmology-dependent information. To do this, we use a large suite of
cosmological ray-tracing N-body simulations to create mock WL convergence maps,
and study the cosmological information content of MFs derived from these maps.
Our suite consists of 80 independent 512^3 N-body runs, covering seven
different cosmologies, varying three cosmological parameters Omega_m, w, and
sigma_8 one at a time, around a fiducial LambdaCDM model. In each cosmology, we
use ray-tracing to create a thousand pseudo-independent 12 deg^2 convergence
maps, and use these in a Monte Carlo procedure to estimate the joint confidence
contours on the above three parameters. We include redshift tomography at three
different source redshifts z_s=1, 1.5, 2, explore five different smoothing
scales theta_G=1, 2, 3, 5, 10 arcmin, and explicitly compare and combine the
MFs with the WL power spectrum. We find that the MFs capture a substantial
amount of information from non-Gaussian features of convergence maps, i.e.
beyond the power spectrum. The MFs are particularly well suited to break
degeneracies and to constrain the dark energy equation of state parameter w (by
a factor of ~ three better than from the power spectrum alone). The
non-Gaussian information derives partly from the one-point function of the
convergence (through V_0, the "area" MF), and partly through non-linear spatial
information (through combining different smoothing scales for V_0, and through
V_1 and V_2, the boundary length and genus MFs, respectively). In contrast to
the power spectrum, the best constraints from the MFs are obtained only when
multiple smoothing scales are combined.
Comment: 19 pages, 9 figures, 5 tables
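For intuition, the three 2D Minkowski functionals of an excursion set can be estimated directly on a pixelized map: V_0 as the area fraction above threshold, V_1 from mismatched neighbour pairs (boundary length up to a constant), and V_2 (the Euler characteristic, i.e. genus-related) by inclusion-exclusion over pixels, shared edges and 2x2 blocks. A numpy sketch under those pixel-level approximations (diagonal-only contacts are miscounted, which is harmless for smoothed maps):

```python
import numpy as np

def minkowski_functionals(field, threshold):
    """Pixelized Minkowski functionals of the excursion set {field > threshold}.
    V0: area fraction; V1: boundary length in pixel-edge units;
    V2: Euler characteristic via faces - edges + 2x2 corner blocks."""
    b = field > threshold
    v0 = float(b.mean())
    # Boundary: mismatched 4-neighbour pairs
    v1 = int((b[:, 1:] ^ b[:, :-1]).sum() + (b[1:, :] ^ b[:-1, :]).sum())
    faces = int(b.sum())
    edges = int((b[:, 1:] & b[:, :-1]).sum() + (b[1:, :] & b[:-1, :]).sum())
    corners = int((b[1:, 1:] & b[1:, :-1] & b[:-1, 1:] & b[:-1, :-1]).sum())
    v2 = faces - edges + corners
    return v0, v1, v2

# Sanity check on a disk: one simply connected blob, Euler characteristic 1
y, x = np.mgrid[-50:50, -50:50]
disk_field = -(x ** 2 + y ** 2).astype(float)
v0, v1, v2 = minkowski_functionals(disk_field, -(20.0 ** 2))  # radius-20 disk
```

Production analyses (including this paper's) use continuous estimators on smoothed convergence maps rather than raw pixel counting, but the three quantities measured are the same.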
EDS tomographic reconstruction regularized by total nuclear variation joined with HAADF-STEM tomography
Energy-dispersive X-ray spectroscopy (EDS) tomography is an advanced technique to characterize compositional information for nanostructures in three dimensions (3D). However, its application is hindered by poor image quality caused by low signal-to-noise ratios and the limited number of tilts, which are fundamentally limited by the insufficient number of X-ray counts. In this paper, we explore how to make accurate EDS reconstructions from such data. We propose to augment EDS tomography by joining it with a more accurate high-angle annular dark-field STEM (HAADF-STEM) tomographic reconstruction, for which a larger number of tilt images is usually feasible. This augmentation is realized through total nuclear variation (TNV) regularization, which encourages the joint EDS and HAADF reconstructions to have not only sparse gradients but also common edges and parallel (or antiparallel) gradients. Our experiments show that the reconstructed images are more accurate than the non-regularized and total variation regularized reconstructions, even when the number of tilts is small or the X-ray counts are low.
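The TNV coupling can be made concrete: at each pixel, stack the gradients of the two channels into a 2x2 Jacobian and penalize its nuclear norm (sum of singular values), which is smallest when the two gradients are parallel or antiparallel. A numpy sketch that only evaluates the penalty, not the full regularized reconstruction:

```python
import numpy as np

def total_nuclear_variation(u, v):
    """Sum over pixels of the nuclear norm of the 2x2 Jacobian
    stacking the gradients of channels u and v.
    Forward differences, zero at the far boundary."""
    def grad(img):
        gx = np.zeros_like(img)
        gy = np.zeros_like(img)
        gx[:, :-1] = img[:, 1:] - img[:, :-1]
        gy[:-1, :] = img[1:, :] - img[:-1, :]
        return gx, gy

    ux, uy = grad(np.asarray(u, float))
    vx, vy = grad(np.asarray(v, float))
    # Per-pixel Jacobian J = [[ux, uy], [vx, vy]], shape (H, W, 2, 2)
    J = np.stack([np.stack([ux, uy], -1), np.stack([vx, vy], -1)], -2)
    s = np.linalg.svd(J, compute_uv=False)  # batched singular values, (H, W, 2)
    return float(s.sum())

# Aligned edges (v = 2u) cost less than perpendicular edges of equal strength:
u = np.zeros((8, 8))
u[:, 4:] = 1.0
tnv_aligned = total_nuclear_variation(u, 2.0 * u)    # shared vertical edge
tnv_crossed = total_nuclear_variation(u, 2.0 * u.T)  # vertical vs horizontal edge
```

This preference for shared edge orientation is exactly what lets the well-sampled HAADF channel stabilize the noisy EDS channel.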
Improving ICD-based semantic similarity by accounting for varying degrees of comorbidity
Finding similar patients is a common objective in precision medicine,
facilitating treatment outcome assessment and clinical decision support.
Choosing widely-available patient features and appropriate mathematical methods
for similarity calculations is crucial. International Statistical
Classification of Diseases and Related Health Problems (ICD) codes are used
worldwide to encode diseases and are available for nearly all patients.
Aggregated as sets consisting of primary and secondary diagnoses they can
display a degree of comorbidity and reveal comorbidity patterns. It is possible
to compute the similarity of patients based on their ICD codes by using
semantic similarity algorithms. These algorithms have been traditionally
evaluated using a single-term expert rated data set.
However, real-world patient data often display varying degrees of documented
comorbidities that might impair algorithm performance. To account for this, we
present a scale term that considers documented comorbidity-variance. In this
work, we compared the performance of 80 combinations of established algorithms
in terms of semantic similarity based on ICD-code sets. The sets have been
extracted from patients with a C25.X (pancreatic cancer) primary diagnosis and
provide a variety of different combinations of ICD-codes. Using our scale term
we yielded the best results with a combination of level-based information
content, Leacock & Chodorow concept similarity and bipartite graph matching for
the set similarities reaching a correlation of 0.75 with our expert's ground
truth. Our results highlight the importance of accounting for comorbidity
variance while demonstrating how well current semantic similarity algorithms
perform.
Comment: 11 pages, 6 figures, 1 table
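Two of the ingredients named above, Leacock & Chodorow concept similarity and bipartite graph matching, can be sketched on a toy hierarchy. The codes, depth, and aggregation are illustrative stand-ins, not the paper's exact pipeline (which additionally uses level-based information content and the comorbidity scale term):

```python
import math
import numpy as np
from scipy.optimize import linear_sum_assignment

# Toy ICD-like hierarchy: child -> parent (illustrative, not the real ICD tree)
PARENT = {"C25.0": "C25", "C25.1": "C25", "C25": "C",
          "K31.1": "K31", "K31": "K", "C": "root", "K": "root"}

def path_to_root(code):
    path = [code]
    while path[-1] != "root":
        path.append(PARENT[path[-1]])
    return path

def leacock_chodorow(a, b, max_depth=4):
    """sim = -log(shortest_path / (2 * max_depth)), path counted in nodes."""
    pa, pb = path_to_root(a), path_to_root(b)
    ancestors = set(pa)
    for i, node in enumerate(pb):
        if node in ancestors:
            dist = pa.index(node) + i + 1  # node count through the common ancestor
            return -math.log(dist / (2.0 * max_depth))
    raise ValueError("no common ancestor")

def set_similarity(codes_a, codes_b):
    """Aggregate pairwise code similarities via an optimal bipartite matching."""
    sim = np.array([[leacock_chodorow(a, b) for b in codes_b] for a in codes_a])
    rows, cols = linear_sum_assignment(-sim)  # negate to maximize similarity
    return float(sim[rows, cols].mean())

s_close = set_similarity(["C25.0"], ["C25.1"])  # siblings under C25
s_far   = set_similarity(["C25.0"], ["K31.1"])  # only related via the root
```

With full diagnosis sets on each side, the matching assigns each code of one patient to its best counterpart in the other, which is where the comorbidity-variance scale term of the paper comes into play.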